11. TensorFlow Softmax
In the previous lesson you built a softmax function from scratch. Now let's see how softmax is done in TensorFlow.
x = tf.nn.softmax([2.0, 1.0, 0.2])
Easy as that! tf.nn.softmax() implements the softmax function for you: it takes in logits and returns the softmax activations.
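Since you built softmax from scratch in the previous lesson, here is a minimal NumPy sketch of what tf.nn.softmax computes under the hood (the `softmax` helper below is our own illustration, not part of TensorFlow):

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax: subtract the max before exponentiating."""
    exps = np.exp(np.asarray(logits) - np.max(logits))
    return exps / exps.sum()

probs = softmax([2.0, 1.0, 0.2])
print(probs)        # the largest logit gets the largest probability
print(probs.sum())  # the probabilities sum to 1
```

Subtracting the maximum logit before exponentiating does not change the result, but it prevents overflow for large logits.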
Quiz
Use the softmax function in the quiz below to return the softmax of the logits.
Start Quiz:
# Solution is available in the other "solution.py" tab
import tensorflow as tf


def run():
    output = None
    logit_data = [2.0, 1.0, 0.1]
    logits = tf.placeholder(tf.float32)

    # TODO: Calculate the softmax of the logits
    # softmax =

    with tf.Session() as sess:
        # TODO: Feed in the logit data
        # output = sess.run(softmax, )
        pass

    return output
# Quiz Solution
# Note: You can't run code in this tab
import tensorflow as tf


def run():
    output = None
    logit_data = [2.0, 1.0, 0.1]
    logits = tf.placeholder(tf.float32)
    softmax = tf.nn.softmax(logits)

    with tf.Session() as sess:
        output = sess.run(softmax, feed_dict={logits: logit_data})

    return output
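As a cross-check on the solution above (assuming the same logit data), the identical result can be computed directly with NumPy, since tf.nn.softmax applies the same formula element-wise:

```python
import numpy as np

# Same logits as in the quiz solution
logit_data = [2.0, 1.0, 0.1]

# softmax(x) = exp(x) / sum(exp(x))
exps = np.exp(logit_data)
output = exps / exps.sum()
print(np.round(output, 3))  # approximately [0.659 0.242 0.099]
```

Note how the logit 2.0 captures about two thirds of the probability mass: softmax exaggerates differences between logits because of the exponential.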
Quiz
Answer the following 2 questions about softmax.